Learning Biomolecular Motion: The Physics-Informed Machine Learning Paradigm

Deshpande, Aaryesh

arXiv.org Machine Learning

The convergence of statistical learning and molecular physics is transforming our approach to modeling biomolecular systems. Physics-informed machine learning (PIML) offers a systematic framework that integrates data-driven inference with physical constraints, resulting in models that are accurate, mechanistic, generalizable, and able to extrapolate beyond observed domains. This review surveys recent advances in physics-informed neural networks and operator learning, differentiable molecular simulation, and hybrid physics-ML potentials, with emphasis on long-timescale kinetics, rare events, and free-energy estimation. We frame these approaches as solutions to the "biomolecular closure problem", recovering unresolved interactions beyond classical force fields while preserving thermodynamic consistency and mechanistic interpretability. We examine theoretical foundations, tools and frameworks, computational trade-offs, and unresolved issues, including model expressiveness and stability. We outline prospective research avenues at the intersection of machine learning, statistical physics, and computational chemistry, contending that future advancements will depend on mechanistic inductive biases and integrated differentiable physical learning frameworks for biomolecular simulation and discovery.


Physics-Informed Machine Learning On Polar Ice: A Survey

Liu, Zesheng, Koo, YoungHyun, Rahnemoonfar, Maryam

arXiv.org Artificial Intelligence

The mass loss of the polar ice sheets contributes considerably to ongoing sea-level rise and changing ocean circulation, leading to coastal flooding and risking the homes and livelihoods of tens of millions of people globally. To address the complex problem of ice behavior, physical models and data-driven models have been proposed in the literature. Although traditional physical models can guarantee physically meaningful results, they have limitations in producing high-resolution results. On the other hand, data-driven approaches require large amounts of high-quality and labeled data, which is rarely available in the polar regions. Hence, as a promising framework that leverages the advantages of physical models and data-driven methods, physics-informed machine learning (PIML) has been widely studied in recent years. In this paper, we review the existing algorithms of PIML, provide our own taxonomy based on the methods of combining physics and data-driven approaches, and analyze the advantages of PIML in the aspects of accuracy and efficiency. Further, our survey discusses some current challenges and highlights future opportunities, including PIML on sea ice studies, PIML with different combination methods and backbone networks, and neural operator methods.


Numerical analysis of physics-informed neural networks and related models in physics-informed machine learning

De Ryck, Tim, Mishra, Siddhartha

arXiv.org Artificial Intelligence

Physics-informed neural networks (PINNs) and their variants have been very popular in recent years as algorithms for the numerical simulation of both forward and inverse problems for partial differential equations. This article aims to provide a comprehensive review of currently available results on the numerical analysis of PINNs and related models that constitute the backbone of physics-informed machine learning. We provide a unified framework in which analysis of the various components of the error incurred by PINNs in approximating PDEs can be effectively carried out. A detailed review of available results on approximation, generalization and training errors and their behavior with respect to the type of the PDE and the dimension of the underlying domain is presented. In particular, the role of the regularity of the solutions and their stability to perturbations in the error analysis is elucidated. Numerical results are also presented to illustrate the theory. We identify training errors as a key bottleneck which can adversely affect the overall performance of various models in physics-informed machine learning.


An operator preconditioning perspective on training in physics-informed machine learning

De Ryck, Tim, Bonnet, Florent, Mishra, Siddhartha, de Bézenac, Emmanuel

arXiv.org Artificial Intelligence

In this paper, we investigate the behavior of gradient descent algorithms in physics-informed machine learning methods like PINNs, which minimize residuals connected to partial differential equations (PDEs). Our key result is that the difficulty in training these models is closely related to the conditioning of a specific differential operator. This operator, in turn, is associated with the Hermitian square of the differential operator of the underlying PDE. If this operator is ill-conditioned, training is slow or infeasible; preconditioning it is therefore crucial. We employ both rigorous mathematical analysis and empirical evaluations to investigate various strategies, explaining how they better condition this critical operator and consequently improve training.
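The conditioning effect analyzed here can be illustrated numerically: for a symmetric operator, passing to the Hermitian square squares the condition number, so a residual (PINN-style) loss can be far harder to optimize than the original problem. A minimal sketch, using a standard finite-difference 1-D Laplacian as a stand-in for the differential operator (this matrix is an illustrative assumption, not code from the paper):

```python
import numpy as np

# Discretize -u'' on (0, 1) with the standard second-order stencil;
# A plays the role of the differential operator in the PDE residual.
n = 50
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

cond_A = np.linalg.cond(A)          # conditioning of the operator itself
cond_AtA = np.linalg.cond(A.T @ A)  # conditioning of its Hermitian square

# For this symmetric positive-definite A, cond(A^T A) = cond(A)^2:
# minimizing the squared residual effectively works with the much
# worse-conditioned Hermitian square, which is the training bottleneck
# the paper's preconditioning strategies target.
print(cond_A, cond_AtA)
```

Refining the mesh (larger `n`) makes `cond_A` grow roughly like `n**2`, so the squared effect on the residual loss becomes severe quickly.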


Estimating irregular water demands with physics-informed machine learning to inform leakage detection

Daniel, Ivo, Cominola, Andrea

arXiv.org Artificial Intelligence

Leakages in drinking water distribution networks pose significant challenges to water utilities, leading to infrastructure failure, operational disruptions, environmental hazards, property damage, and economic losses. The timely identification and accurate localisation of such leakages is paramount for utilities to mitigate these unwanted effects. However, implementation of algorithms for leakage detection is limited in practice by requirements of either hydraulic models or large amounts of training data. Physics-informed machine learning can utilise hydraulic information, thereby circumventing both limitations. In this work, we present a physics-informed machine learning algorithm that analyses pressure data and estimates unknown irregular water demands from it via a fully connected neural network, ultimately leveraging the Bernoulli equation and effectively linearising the leakage detection problem. Our algorithm is tested on data from the L-Town benchmark network, and results indicate a good capability for estimating most irregular demands, with R² larger than 0.8. Identification results for leakages under the presence of irregular demands could be improved by a factor of 5.3 for abrupt leaks and a factor of 3.0 for incipient leaks when compared to results disregarding irregular demands.
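The hydraulic information the algorithm leverages comes from the Bernoulli equation, which relates pressure, elevation, and velocity heads; head differences between sensor nodes are the physical signal a demand or leak estimator works with. A minimal sketch of that head computation (function names and sensor values are hypothetical, not from the paper):

```python
import numpy as np

G = 9.81      # gravitational acceleration, m/s^2
RHO = 1000.0  # water density, kg/m^3

def total_head(pressure_pa, elevation_m, velocity_ms=0.0):
    # Bernoulli: total head [m] = pressure head + elevation head
    # + velocity head (often negligible in distribution networks).
    return pressure_pa / (RHO * G) + elevation_m + velocity_ms**2 / (2.0 * G)

# Two hypothetical sensor nodes; the head loss between them reflects
# flow along the connecting pipe, which demand/leak changes perturb.
h1 = total_head(pressure_pa=250_000.0, elevation_m=12.0)
h2 = total_head(pressure_pa=235_000.0, elevation_m=12.5)
head_loss = h1 - h2
```

In the paper's setting, a neural network maps such pressure-derived quantities to the unknown irregular demands; the sketch above only shows the physics side of that mapping.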


A physics-informed machine learning model for reconstruction of dynamic loads

Tondo, Gledson Rodrigo, Kavrakov, Igor, Morgenthal, Guido

arXiv.org Artificial Intelligence

Long-span bridges are subjected to a multitude of dynamic excitations during their lifespan. To account for their effects on the structural system, several load models are used during design to simulate the conditions the structure is likely to experience. These models are based on different simplifying assumptions and are generally guided by parameters that are stochastically identified from measurement data, making their outputs inherently uncertain. This paper presents a probabilistic physics-informed machine-learning framework based on Gaussian process regression for reconstructing dynamic forces based on measured deflections, velocities, or accelerations. The model can work with incomplete and contaminated data and offers a natural regularization approach to account for noise in the measurement system. An application of the developed framework is given by an aerodynamic analysis of the Great Belt East Bridge. The aerodynamic response is calculated numerically based on the quasi-steady model, and the underlying forces are reconstructed using sparse and noisy measurements. Results indicate a good agreement between the applied and the predicted dynamic load and can be extended to calculate global responses and the resulting internal forces. Uses of the developed framework include validation of design models and assumptions, as well as prognosis of responses to assist in damage detection and structural health monitoring.
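The backbone of such a framework is Gaussian process regression, where the measurement-noise term supplies the natural regularization the abstract mentions. A minimal 1-D reconstruction sketch (the kernel choice, hyperparameters, and the synthetic "force" signal are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def rbf_kernel(x1, x2, length=0.5, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    # Standard GP posterior mean; the noise term regularizes the solve,
    # which is how contaminated measurements are handled gracefully.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    return K_s @ np.linalg.solve(K, y_train)

# Reconstruct a smooth "load" signal from sparse, noisy observations.
rng = np.random.default_rng(0)
x_obs = np.sort(rng.uniform(0.0, 2.0 * np.pi, 15))
y_obs = np.sin(x_obs) + 0.05 * rng.normal(size=x_obs.size)
x_grid = np.linspace(0.0, 2.0 * np.pi, 100)
f_hat = gp_posterior_mean(x_obs, y_obs, x_grid)
```

The physics-informed element of the paper enters through the structural model linking measured responses to the underlying forces; the sketch shows only the probabilistic regression layer.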


Physics-informed machine learning

#artificialintelligence

Despite great progress in simulating multiphysics problems using the numerical discretization of partial differential equations (PDEs), one still cannot seamlessly incorporate noisy data into existing algorithms, mesh generation remains complex, and high-dimensional problems governed by parameterized PDEs cannot be tackled. Moreover, solving inverse problems with hidden physics is often prohibitively expensive and requires different formulations and elaborate computer codes. Machine learning has emerged as a promising alternative, but training deep neural networks requires big data, not always available for scientific problems. Instead, such networks can be trained from additional information obtained by enforcing the physical laws (for example, at random points in the continuous space-time domain). Such physics-informed learning integrates (noisy) data and mathematical models, and implements them through neural networks or other kernel-based regression networks. Moreover, it may be possible to design specialized network architectures that automatically satisfy some of the physical invariants for better accuracy, faster training and improved generalization. Here, we review some of the prevailing trends in embedding physics into machine learning, present some of the current capabilities and limitations and discuss diverse applications of physics-informed learning both for forward and inverse problems, including discovering hidden physics and tackling high-dimensional problems. The rapidly developing field of physics-informed learning integrates data and mathematical models seamlessly, enabling accurate inference of realistic and high-dimensional multiphysics problems. This Review discusses the methodology and provides diverse examples and an outlook for further developments.
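The core idea of enforcing physical laws at random points in the continuous domain can be shown on a toy problem. The sketch below fits u' = -u, u(0) = 1 by minimizing the equation residual at random collocation points; a polynomial basis and linear least squares stand in for a neural network and gradient-based training (all modeling choices here are illustrative, not from the Review):

```python
import numpy as np

rng = np.random.default_rng(1)
degree = 10
t_col = rng.uniform(0.0, 2.0, 80)  # random collocation points in [0, 2]

# Represent u(t) = sum_k c_k t^k. The residual u'(t) + u(t) is linear in
# the coefficients c, so enforcing it at collocation points is a least-
# squares problem. Row j, column k: k*t^(k-1) + t^k at t_col[j].
powers = np.arange(degree + 1)
basis = t_col[:, None] ** powers[None, :]
dbasis = np.where(
    powers[None, :] > 0,
    powers[None, :] * t_col[:, None] ** np.maximum(powers[None, :] - 1, 0),
    0.0,
)
residual_rows = dbasis + basis
rhs = np.zeros(len(t_col))

# Append the initial condition u(0) = 1 as a strongly weighted data row,
# mirroring how PINNs weight boundary/initial terms in the loss.
ic_row = np.zeros(degree + 1)
ic_row[0] = 1.0
A = np.vstack([residual_rows, 100.0 * ic_row])
b = np.append(rhs, 100.0)

coef, *_ = np.linalg.lstsq(A, b, rcond=None)
u = lambda t: (t ** powers) @ coef  # learned solution, approx. exp(-t)
```

No solution data is used anywhere: the fit is driven entirely by the residual of the governing equation plus the initial condition, which is exactly the "training from enforced physical laws" described above.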


When Physics Meets Machine Learning: A Survey of Physics-Informed Machine Learning

Meng, Chuizheng, Seo, Sungyong, Cao, Defu, Griesemer, Sam, Liu, Yan

arXiv.org Machine Learning

Machine learning and deep learning models have already achieved tremendous success in domains such as computer vision [1, 2, 3, 4, 5] and natural language processing [6, 7, 8, 9, 10, 11, 12, 13, 14], where large amounts of training data and highly expressive neural network architectures together give rise to solutions that outperform previously dominant methods. As a consequence, researchers have also started exploring the possibility of applying machine learning models to advance scientific discovery and to further improve traditional analytical modeling [15, 16, 17, 18, 19, 20, 21]. While deep neural networks can extract complicated input-output relations through appropriate optimization over sufficiently large datasets, prior knowledge still plays an important role in finding the optimal solution. As a high-level abstraction of data distributions and task properties, prior knowledge, if incorporated properly, can supply rich information that is absent from or hard to extract out of limited training data, and helps improve data efficiency, generalization, and the plausibility of the resulting models. Physics knowledge, collected and validated both theoretically and empirically over a long history, encodes a tremendous abstraction and summary of natural phenomena and human behaviours in many important scientific and engineering applications. Thus, in this paper we focus on integrating prior physics knowledge into machine learning models, i.e., physics-informed machine learning (PIML). Compared with the integration of other types of prior knowledge, such as knowledge graphs, logic rules, and human feedback [22], the integration of physics knowledge requires specific design due to its special properties and forms. In this paper, we survey a wide range of recent works in PIML and summarize them from three aspects.


AI learns physics to optimize particle accelerator performance

#artificialintelligence

Machine learning, a form of artificial intelligence, vastly speeds up computational tasks and enables new technology in areas as broad as speech and image recognition, self-driving cars, stock market trading and medical diagnosis. Before going to work on a given task, machine learning algorithms typically need to be trained on pre-existing data so they can learn to make fast and accurate predictions about future scenarios on their own. But what if the job is a completely new one, with no data available for training? Now, researchers at the Department of Energy's SLAC National Accelerator Laboratory have demonstrated that they can use machine learning to optimize the performance of particle accelerators by teaching the algorithms the basic physics principles behind accelerator operations--no prior data needed. "Injecting physics into machine learning is a really hot topic in many research areas--in materials science, environmental science, battery research, particle physics and more," said Adi Hanuka, a former SLAC research associate who led a study published in Physical Review Accelerators and Beams.